Conditional mutual information

In probability theory, and in particular information theory, the conditional mutual information is, in its most basic form, the expected value of the mutual information of two random variables given the value of a third.
==Definition==
For discrete random variables X, Y, and Z, we define
:I(X;Y|Z) = \mathbb{E}_Z \big( I(X;Y) \mid Z \big) = \sum_{z\in Z} p_Z(z) \sum_{y\in Y} \sum_{x\in X} p_{X,Y|Z}(x,y|z) \log \frac{p_{X,Y|Z}(x,y|z)}{p_{X|Z}(x|z)\, p_{Y|Z}(y|z)},
where the marginal, joint, and/or conditional probability mass functions are denoted by p with the appropriate subscript. This can be simplified as
:I(X;Y|Z) = \sum_{z\in Z} \sum_{y\in Y} \sum_{x\in X} p_{X,Y,Z}(x,y,z) \log \frac{p_Z(z)\, p_{X,Y,Z}(x,y,z)}{p_{X,Z}(x,z)\, p_{Y,Z}(y,z)}.
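To make the formula concrete, here is a minimal Python sketch (not part of the original article; the joint pmf p_xyz and the helper marginal are invented for illustration) that evaluates the triple sum directly for small binary alphabets.
<syntaxhighlight lang="python">
# Minimal illustrative sketch (not from the article): evaluate I(X;Y|Z) for
# discrete variables directly from the triple-sum formula above.
# The joint pmf p_xyz below is an invented example distribution.
from collections import defaultdict
from math import log2

p_xyz = {
    (0, 0, 0): 0.25,  (1, 1, 0): 0.25,    # given z = 0, X and Y are perfectly correlated
    (0, 0, 1): 0.125, (0, 1, 1): 0.125,   # given z = 1, X and Y are independent
    (1, 0, 1): 0.125, (1, 1, 1): 0.125,
}

def marginal(p, keep):
    """Sum out every coordinate except those whose indices are listed in `keep`."""
    m = defaultdict(float)
    for outcome, prob in p.items():
        m[tuple(outcome[i] for i in keep)] += prob
    return m

p_z  = marginal(p_xyz, (2,))     # p(z)
p_xz = marginal(p_xyz, (0, 2))   # p(x, z)
p_yz = marginal(p_xyz, (1, 2))   # p(y, z)

# I(X;Y|Z) = sum_{x,y,z} p(x,y,z) * log[ p(z) p(x,y,z) / (p(x,z) p(y,z)) ]
cmi = sum(
    p * log2(p_z[(z,)] * p / (p_xz[(x, z)] * p_yz[(y, z)]))
    for (x, y, z), p in p_xyz.items()
)
print(f"I(X;Y|Z) = {cmi:.4f} bits")  # 0.5: X and Y share one bit, but only when Z = 0
</syntaxhighlight>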
Alternatively, we may write it〔K. Makarychev et al. ''A new class of non-Shannon-type inequalities for entropies.'' Communications in Information and Systems, Vol. 2, No. 2, pp. 147–166, December 2002 (PDF)〕 in terms of joint and conditional entropies as
:I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z) = H(X|Z) - H(X|Y,Z).
This can be rewritten to show its relationship to mutual information:
:I(X;Y|Z) = I(X;Y,Z) - I(X;Z),
usually rearranged as the chain rule for mutual information:
:I(X;Y,Z) = I(X;Z) + I(X;Y|Z).
Another equivalent form of the above is:
:I(X;Y|Z) = H(Z|X) + H(X) + H(Z|Y) + H(Y) - H(Z|X,Y) - H(X,Y) - H(Z) = I(X;Y) + H(Z|X) + H(Z|Y) - H(Z|X,Y) - H(Z).
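The entropy identity and the chain rule can be checked numerically; the following sketch (again illustrative, reusing the same invented joint pmf as above) computes I(X;Y|Z) from joint entropies and confirms that I(X;Y,Z) - I(X;Z) gives the same value.
<syntaxhighlight lang="python">
# Sketch (same invented joint pmf as above): compute I(X;Y|Z) from joint
# entropies and check the chain rule numerically.
from collections import defaultdict
from math import log2

p_xyz = {
    (0, 0, 0): 0.25,  (1, 1, 0): 0.25,
    (0, 0, 1): 0.125, (0, 1, 1): 0.125, (1, 0, 1): 0.125, (1, 1, 1): 0.125,
}

def marginal(p, keep):
    m = defaultdict(float)
    for outcome, prob in p.items():
        m[tuple(outcome[i] for i in keep)] += prob
    return m

def entropy(p):
    return -sum(q * log2(q) for q in p.values() if q > 0)

# Joint entropies of the subsets of (X, Y, Z) we need, keyed by coordinate indices.
H = {k: entropy(marginal(p_xyz, k)) for k in [(0,), (2,), (0, 2), (1, 2), (0, 1, 2)]}

# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
cmi = H[(0, 2)] + H[(1, 2)] - H[(0, 1, 2)] - H[(2,)]

# Chain rule: I(X;Y,Z) = I(X;Z) + I(X;Y|Z), using I(A;B) = H(A) + H(B) - H(A,B)
i_x_yz = H[(0,)] + H[(1, 2)] - H[(0, 1, 2)]   # I(X; (Y,Z))
i_x_z  = H[(0,)] + H[(2,)]   - H[(0, 2)]      # I(X; Z)

print(cmi, i_x_yz - i_x_z)  # both print 0.5 bits, as the chain rule requires
</syntaxhighlight>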
Conditioning on a third random variable may either increase or decrease the mutual information: that is, the difference I(X;Y|Z) - I(X;Y), called the interaction information, may be positive, negative, or zero, but it is always true that
:I(X;Y|Z) \ge 0
for discrete, jointly distributed random variables ''X'', ''Y'', ''Z''. This result has been used as a basic building block for proving other inequalities in information theory, in particular, those known as Shannon-type inequalities.
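Two standard textbook examples make this concrete: with X and Y independent fair bits and Z = X XOR Y, conditioning creates dependence (I(X;Y) = 0 but I(X;Y|Z) = 1 bit), whereas with X = Y = Z a single copied fair bit, conditioning removes it (I(X;Y) = 1 bit but I(X;Y|Z) = 0). The sketch below (illustrative code, not from the article; cmi and mi are hypothetical helper names) computes both cases.
<syntaxhighlight lang="python">
# Illustrative sketch of the two classic cases (not from the article):
# conditioning can create mutual information (XOR) or destroy it (copying),
# while I(X;Y|Z) itself never goes negative.
from collections import defaultdict
from math import log2

def cmi(p_xyz):
    """I(X;Y|Z) from a joint pmf keyed by (x, y, z), via the triple-sum formula."""
    def marginal(keep):
        m = defaultdict(float)
        for outcome, q in p_xyz.items():
            m[tuple(outcome[i] for i in keep)] += q
        return m
    p_z, p_xz, p_yz = marginal((2,)), marginal((0, 2)), marginal((1, 2))
    return sum(q * log2(p_z[(z,)] * q / (p_xz[(x, z)] * p_yz[(y, z)]))
               for (x, y, z), q in p_xyz.items())

def mi(p_xyz):
    """Unconditional I(X;Y) from the same joint pmf, ignoring Z."""
    p_xy, p_x, p_y = defaultdict(float), defaultdict(float), defaultdict(float)
    for (x, y, _z), q in p_xyz.items():
        p_xy[(x, y)] += q
        p_x[x] += q
        p_y[y] += q
    return sum(q * log2(q / (p_x[x] * p_y[y])) for (x, y), q in p_xy.items())

# Z = X XOR Y with X, Y independent fair bits: I(X;Y) = 0 but I(X;Y|Z) = 1 bit,
# so the interaction information I(X;Y|Z) - I(X;Y) is positive.
xor_pmf = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

# X = Y = Z (one fair bit copied three times): I(X;Y) = 1 bit but I(X;Y|Z) = 0,
# so the interaction information is negative.
copy_pmf = {(b, b, b): 0.5 for b in (0, 1)}

for name, p in [("xor", xor_pmf), ("copy", copy_pmf)]:
    print(f"{name}: I(X;Y) = {mi(p):.1f}, I(X;Y|Z) = {cmi(p):.1f}")
</syntaxhighlight>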
Like mutual information, conditional mutual information can be expressed as a Kullback–Leibler divergence:
: I(X;Y|Z) = D_{\mathrm{KL}}\big( p(X,Y,Z) \,\|\, p(X|Z)\,p(Y|Z)\,p(Z) \big),
or as an expected value of simpler Kullback–Leibler divergences:
: I(X;Y|Z) = \sum_z p(Z=z)\, D_{\mathrm{KL}}\big( p(X,Y|z) \,\|\, p(X|z)\,p(Y|z) \big),
: I(X;Y|Z) = \sum_y p(Y=y)\, D_{\mathrm{KL}}\big( p(X,Z|y) \,\|\, p(X|Z)\,p(Z|y) \big).
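As a consistency check, the sketch below (reusing the invented joint pmf from the first example) evaluates the expected-KL form by grouping outcomes on z, computing one divergence per value of z, and averaging; for this example it reproduces the 0.5 bits obtained earlier.
<syntaxhighlight lang="python">
# Sketch (same invented joint pmf as the first example): evaluate the
# expected-KL form sum_z p(z) * D_KL( p(X,Y|z) || p(X|z) p(Y|z) ).
from collections import defaultdict
from math import log2

p_xyz = {
    (0, 0, 0): 0.25,  (1, 1, 0): 0.25,
    (0, 0, 1): 0.125, (0, 1, 1): 0.125, (1, 0, 1): 0.125, (1, 1, 1): 0.125,
}

# Group the joint pmf by the value of z.
by_z = defaultdict(dict)
for (x, y, z), q in p_xyz.items():
    by_z[z][(x, y)] = q

cmi = 0.0
for z, slice_z in by_z.items():
    p_z = sum(slice_z.values())
    cond = {xy: q / p_z for xy, q in slice_z.items()}   # p(x, y | z)
    p_x, p_y = defaultdict(float), defaultdict(float)
    for (x, y), q in cond.items():
        p_x[x] += q                                     # p(x | z)
        p_y[y] += q                                     # p(y | z)
    # One Kullback-Leibler divergence per value of z, weighted by p(z).
    kl = sum(q * log2(q / (p_x[x] * p_y[y])) for (x, y), q in cond.items())
    cmi += p_z * kl

print(f"I(X;Y|Z) via expected KL divergence: {cmi:.4f} bits")  # 0.5, matching the earlier computations
</syntaxhighlight>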
